function [Alpha,bias,sol,t,nsv] = kperceptr(data,labels,ker,arg,tmax)
% KPERCEPTR Kernel Perceptron algorithm.
%
% [Alpha,bias,sol,t,nsv]=kperceptr(data,labels,ker,arg,tmax)
%
% Input:
%  data [dim x n] input patterns; dim is dimension and
%    n is number of patterns.
%  labels [1 x n] labels of patterns; 1 denotes the 1st class,
%    2 denotes the 2nd class.
%  ker [string] kernel identifier (see 'help kernel').
%  arg [...] argument of the given kernel.
%  tmax [int] maximal number of iterations.
%
% Output:
%  Alpha [1 x n] non-negative linear pattern multipliers.
%  bias [real] bias of the found decision function.
%  sol [int] 1 - Perceptron converged to a solution (zero
%    training classification error), 0 - Perceptron has not
%    converged in tmax iterations.
%  t [int] number of iterations.
%  nsv [int] number of non-zero multipliers Alpha.
%
% See also PERCEPTR, SVM, SVMCLASS, PSVM.
%
% Statistical Pattern Recognition Toolbox, Vojtech Franc, Vaclav Hlavac
% (c) Czech Technical University Prague, http://cmp.felk.cvut.cz
%
% Modifications:
% 21-Nov-2001, V. Franc

if nargin < 4, error('Not enough input arguments.'); end
if nargin < 5, tmax = inf; end

% get dimension and number of training patterns
[dim, n] = size( data );

% precompute the kernel matrix
K = kmatrix( data, ker, arg );

% convert labels to 1 and -1
Y = itosgn( labels );

% initialize multipliers Alpha and bias
Alpha = zeros( 1, n );
bias = 0;
sol = 0;
t = 0;

while t < tmax && sol == 0,
  t = t + 1;

  % projections Y(i) * ( <w, phi(x_i)> + bias ) for all patterns
  proj = Y'.*( K*(Y.*Alpha)' + bias );

  % find the worst classified pattern
  [dot_prod, inx] = min( proj );

  % check whether an update is needed
  if dot_prod <= 0,
    % perceptron update in the kernel (dual) representation
    Alpha( inx ) = Alpha( inx ) + 1;
    bias = bias + Y( inx );
  else
    sol = 1;
  end
end

% compute the number of non-zero Alphas
nsv = length( find( Alpha ));

return;
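A minimal usage sketch on hypothetical toy data. It assumes the toolbox's KMATRIX and ITOSGN helpers are on the path (as this file requires) and that 'rbf' is one of the kernel identifiers listed by 'help kernel'; the decision function f(x) = sum_i Alpha(i)*Y(i)*k(x_i,x) + bias is evaluated by reusing KMATRIX on the joined data, so no extra kernel-evaluation routine has to be assumed:

```matlab
% Hypothetical two-class toy problem; labels use 1 and 2
% as expected by KPERCEPTR.
data   = [0 1 0 1; 0 0 1 1];   % [dim x n] input patterns
labels = [1 1 2 2];            % [1 x n] class labels

% Train with an RBF kernel ('rbf' and its width argument are
% assumed to follow 'help kernel' in stprtool).
[Alpha,bias,sol,t,nsv] = kperceptr( data, labels, 'rbf', 1, 1000 );

% Classify a test point x: build the kernel matrix over the
% training data joined with x, then evaluate
%   f(x) = sum_i Alpha(i)*Y(i)*k(x_i,x) + bias.
x = [0.9; 0.1];
n = size( data, 2 );
Y = itosgn( labels );
Kall = kmatrix( [data x], 'rbf', 1 );
f = (Y.*Alpha) * Kall( 1:n, n+1 ) + bias;

% map the sign of f back to the 1/2 labelling
predicted_class = (f < 0) + 1;
```

If the perceptron converged (sol == 1), all training patterns satisfy Y(i)*f(x_i) > 0; otherwise increase tmax or choose a kernel under which the classes are separable.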